Numerical Experience with a Reduced Hessian Method for Large Scale Constrained Optimization

Authors

  • Lorenz T. Biegler
  • Jorge Nocedal
  • Claudia Schmid
  • David Ternet
Abstract

We propose a quasi-Newton algorithm for solving large optimization problems with nonlinear equality constraints. It is designed for problems with few degrees of freedom, and is motivated by the need to use sparse matrix factorizations. The algorithm incorporates a correction vector that approximates the cross term Z^T W Y p_Y in order to estimate the curvature in both the range and null spaces of the constraints. The algorithm can be considered to be, in some sense, a practical implementation of an algorithm of Coleman and Conn. We give conditions under which local and superlinear convergence is obtained.
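For readers unfamiliar with the decomposition the abstract refers to, the sketch below (not the authors' implementation) shows how one reduced-Hessian SQP step p = Y p_Y + Z p_Z can be assembled when an approximation w to the cross term Z^T W Y p_Y is available. It uses dense NumPy linear algebra and an orthonormal basis from a QR factorization, both simplifying assumptions; the paper itself targets large sparse problems with few degrees of freedom and a quasi-Newton (BFGS) approximation to the reduced Hessian Z^T W Z.

```python
# Minimal sketch of one reduced-Hessian SQP step for min f(x) s.t. c(x) = 0.
# Illustrative only: dense linear algebra, orthonormal bases from QR, and a
# caller-supplied cross-term vector `w` (hypothetical interface) standing in
# for the correction that approximates Z^T W Y p_Y.
import numpy as np

def reduced_hessian_step(g, A, c, B, w=None):
    """One step p = Y @ pY + Z @ pZ.

    g : objective gradient, shape (n,)
    A : constraint Jacobian, shape (m, n), assumed full row rank
    c : constraint values, shape (m,)
    B : reduced Hessian approximation (≈ Z^T W Z), shape (n-m, n-m)
    w : optional approximation to the cross term Z^T W Y p_Y, shape (n-m,)
    """
    n, m = g.size, c.size
    # Range-space basis Y and null-space basis Z from a complete QR of A^T.
    Q, R = np.linalg.qr(A.T, mode="complete")
    Y, Z = Q[:, :m], Q[:, m:]
    # Range-space step: solve (A Y) pY = -c, which here is R[:m].T pY = -c.
    pY = np.linalg.solve(R[:m, :].T, -c)
    # Null-space step: B pZ = -(Z^T g + cross term).
    rhs = -(Z.T @ g + (w if w is not None else np.zeros(n - m)))
    pZ = np.linalg.solve(B, rhs)
    return Y @ pY + Z @ pZ

# Tiny usage example: min x0^2 + x1^2 + x2^2  s.t.  x0 + x1 + x2 = 1.
x = np.zeros(3)
g = 2.0 * x
A = np.array([[1.0, 1.0, 1.0]])
c = np.array([x.sum() - 1.0])
B = np.eye(2)                     # initial reduced-Hessian approximation
print(reduced_hessian_step(g, A, c, B))   # step toward the constrained minimizer
```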


Similar articles

Numerical Experience with a Reduced Hessian Method for Large Scale Constrained Optimization (Optimization Technology Center, Argonne National Laboratory and Northwestern University)

The reduced Hessian SQP algorithm is developed in this paper into a practical method for large scale optimization. The novelty of the algorithm lies in the incorporation of a correction vector that approximates the cross term Z^T W Y p_Y. This improves the stability and robustness of the algorithm without increasing its computational cost. The paper studies how to implement the algorithm e...


A Limited-Memory Multipoint Symmetric Secant Method for Bound Constrained Optimization

A new algorithm for solving smooth large-scale minimization problems with bound constraints is introduced. The way of dealing with active constraints is similar to the one used in some recently introduced quadratic solvers. A limited-memory multipoint symmetric secant method for approximating the Hessian is presented. Positive-definiteness of the Hessian approximation is not enforced. A combina...


Numerical Experiments with Methods for Solving the KKT Equations

In many sequential quadratic programming algorithms for constrained optimization the calculation of an effective search direction depends on the (estimated) Hessian of the Lagrangian being positive definite on the null space of the active constraints. This paper reports some numerical experience with two techniques for checking the properties of the Hessian and, if necessary, modifying it during...
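As a rough illustration of the condition that paper examines (not of its two modification techniques), the sketch below checks positive definiteness of the reduced Hessian Z^T W Z by attempting a Cholesky factorization; the function name and the dense linear algebra are assumptions made for illustration only.

```python
# Check whether W is positive definite on the null space of the
# active-constraint Jacobian A, by factoring Z^T W Z.
import numpy as np

def reduced_hessian_is_pd(W, A):
    """Return (is_pd, ZWZ), where the columns of Z span the null space of A."""
    m, n = A.shape
    Q, _ = np.linalg.qr(A.T, mode="complete")
    Z = Q[:, m:]                      # orthonormal null-space basis
    ZWZ = Z.T @ W @ Z
    try:
        np.linalg.cholesky(ZWZ)      # succeeds exactly when Z^T W Z is positive definite
        return True, ZWZ
    except np.linalg.LinAlgError:
        return False, ZWZ

# Example: W is indefinite on R^2 but positive definite on the null space
# of A = [1, 0], i.e. along the x1 direction.
W = np.diag([-1.0, 2.0])
A = np.array([[1.0, 0.0]])
print(reduced_hessian_is_pd(W, A)[0])   # True
```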


SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization

Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. We discuss an SQP algorithm th...


SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization

Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. Second derivatives are assumed...



Journal:
  • SIAM Journal on Optimization

Volume 5, Issue —

Pages —

Publication date: 1995